# English Language Model

## Olmo 2 0425 1B

allenai · Apache-2.0 · 13.31k downloads · 45 likes

OLMo 2 1B is the smallest model in the open language model series released by the Allen Institute for AI. It was pretrained on the OLMo-mix-1124 dataset and further trained on the Dolmino-mix-1124 dataset during a mid-training stage.

Tags: Large Language Model · Transformers · English
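Since the card is tagged Transformers, a minimal loading sketch may be useful. The repo id `allenai/OLMo-2-0425-1B` is an assumption inferred from the card name; check the model page for the exact identifier.

```python
# Minimal sketch: loading the card's model with Hugging Face Transformers.
# The repo id is assumed from the card name, not confirmed by this page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-2-0425-1B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Language modeling is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```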
## Gemma 2 Ataraxy V4d 9B

lemon07r · 236 downloads · 16 likes

The most well-rounded model in the Ataraxy series: it primarily strengthens creative writing while remaining capable in general-purpose scenarios.

Tags: Large Language Model · Transformers · English
## Relullama 7B

SparseLLM · 5,323 downloads · 11 likes

A ReLU-activated sparse large language model fine-tuned from Llama 2 7B; it improves computational efficiency by dynamically activating only a subset of parameters for each input.

Tags: Large Language Model · Transformers · English
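The efficiency claim rests on activation sparsity: after a ReLU, many hidden units are exactly zero, so the matching columns of the next weight matrix can be skipped. A minimal NumPy sketch of that idea, using random weights and toy sizes rather than the model's actual layers:

```python
# Illustrative sketch of ReLU-induced activation sparsity (toy sizes,
# random weights -- not ReluLLaMA's actual architecture).
import numpy as np

rng = np.random.default_rng(0)
d_model, d_ff = 64, 256

x = rng.standard_normal(d_model)            # one token's hidden state
W_up = rng.standard_normal((d_ff, d_model))
W_down = rng.standard_normal((d_model, d_ff))

h = np.maximum(W_up @ x, 0.0)               # ReLU zeroes many units exactly
active = h != 0
print(f"active units: {active.sum()}/{d_ff} ({active.mean():.0%})")

# Dense down-projection vs. computing only the columns for active units.
dense = W_down @ h
sparse = W_down[:, active] @ h[active]      # skips work for zeroed units
assert np.allclose(dense, sparse)
```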
## Tinyllama 1.1B Step 50K 105b

TinyLlama · Apache-2.0 · 14.41k downloads · 133 likes

TinyLlama is a 1.1B-parameter model following the Llama architecture, with a plan to pretrain it on 3 trillion tokens; the training setup is optimized to finish in 90 days on 16 A100-40G GPUs. As the card name suggests, this entry is an intermediate checkpoint taken at step 50K, after roughly 105B tokens.

Tags: Large Language Model · Transformers · English
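The 90-day budget implies a concrete throughput target. A quick back-of-the-envelope check using only the numbers stated on the card (3T tokens, 90 days, 16 GPUs):

```python
# Throughput implied by the card's figures:
# 3 trillion tokens in 90 days on 16 A100-40G GPUs.
tokens = 3e12
days = 90
gpus = 16

seconds = days * 24 * 3600                # 7,776,000 s
cluster_tput = tokens / seconds           # tokens/s across all GPUs
per_gpu_tput = cluster_tput / gpus

print(f"cluster throughput: {cluster_tput:,.0f} tokens/s")
print(f"per-GPU throughput: {per_gpu_tput:,.0f} tokens/s")
# ~385,802 tokens/s total, ~24,113 tokens/s per A100-40G.
```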